Does $\ell_p$-Minimization Outperform $\ell_1$-Minimization?

Authors

Abstract


Similar articles

Recovery of sparsest signals via $\ell^q$-minimization

In this paper, it is proved that every $s$-sparse vector $x \in \mathbb{R}^n$ can be exactly recovered from the measurement vector $z = Ax \in \mathbb{R}^m$ via some $\ell^q$-minimization with $0 < q \le 1$, as soon as each $s$-sparse vector $x \in \mathbb{R}^n$ is uniquely determined by the measurement $z$.
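At the $q = 1$ endpoint of this family, sparse recovery becomes basis pursuit, which can be posed as a linear program. A minimal sketch with SciPy, using the standard split $x = u - v$ with $u, v \ge 0$; the problem sizes and random seed are illustrative and not from the paper:

```python
# Hedged sketch: l1-minimization (basis pursuit) as a linear program.
# min ||x||_1  subject to  Ax = z, solved via scipy.optimize.linprog.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 20, 12                      # ambient dimension, number of measurements
A = rng.standard_normal((m, n))    # Gaussian measurement matrix (illustrative)

x_true = np.zeros(n)
x_true[[3, 11]] = [1.5, -2.0]      # an s-sparse signal with s = 2
z = A @ x_true                     # measurement vector z = Ax

# Split x = u - v with u, v >= 0, so ||x||_1 = sum(u) + sum(v):
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=z, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]
```

With enough Gaussian measurements relative to the sparsity level, `x_hat` matches `x_true` up to solver tolerance, illustrating the exact-recovery statement above for $q = 1$.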


Robustness of classifiers to uniform $\ell_p$ and Gaussian noise

We study the robustness of classifiers to various kinds of random noise models. In particular, we consider noise drawn uniformly from the $\ell_p$ ball for $p \in [1,\infty]$ and Gaussian noise with an arbitrary covariance matrix. We characterize this robustness to random noise in terms of the distance to the decision boundary of the classifier. This analysis applies to linear classifiers as well as classifie...
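For the linear case, the distance to the decision boundary has a closed form: a point $x$ lies at distance $|w \cdot x + b| / \|w\|_2$ from the hyperplane $\{y : w \cdot y + b = 0\}$. A minimal sketch (the function name and values are illustrative, not from the paper):

```python
# Hedged sketch: Euclidean distance from a point to a linear classifier's
# decision boundary -- the quantity the robustness analysis above is phrased in.
import numpy as np

def boundary_distance(w, b, x):
    """Distance from x to the hyperplane {y : w.y + b = 0}."""
    return abs(w @ x + b) / np.linalg.norm(w)

w = np.array([3.0, 4.0])           # ||w||_2 = 5
x = np.array([1.0, 1.0])
d = boundary_distance(w, -2.0, x)  # |3 + 4 - 2| / 5 = 1.0
```

A perturbation of $x$ smaller than this distance (in $\ell_2$) cannot flip the sign of $w \cdot x + b$, which is why robustness bounds are naturally stated in terms of it.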


$\ell^1$-Analysis Minimization and Generalized (Co-)Sparsity: When Does Recovery Succeed?

This paper investigates the problem of signal estimation from undersampled noisy sub-Gaussian measurements under the assumption of a cosparse model. Based on generalized notions of sparsity, we derive novel recovery guarantees for the $\ell^1$-analysis basis pursuit, enabling highly accurate predictions of its sample complexity. The corresponding bounds on the number of required measurements do expli...


A $2\ell k$ Kernel for $\ell$-Component Order Connectivity

In the $\ell$-Component Order Connectivity problem ($\ell \in \mathbb{N}$), we are given a graph $G$ on $n$ vertices and $m$ edges and a non-negative integer $k$, and ask whether there exists a set of vertices $S \subseteq V(G)$ such that $|S| \le k$ and the size of the largest connected component in $G - S$ is at most $\ell$. In this paper, we give a kernel for $\ell$-Component Order Connectivity with at most $2\ell k$ vertices that takes $n^{O(\ell)}$ time for eve...


Performance of first- and second-order methods for $\ell_1$-regularized least squares problems

We study the performance of first- and second-order optimization methods for $\ell_1$-regularized sparse least-squares problems as the conditioning of the problem changes and the dimensions of the problem increase up to one trillion. A rigorously defined generator is presented which allows control of the dimensions, the conditioning and the sparsity of the problem. The generator has very low memory req...



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2017

ISSN: 0018-9448, 1557-9654

DOI: 10.1109/tit.2017.2717585